Weak Learners and Improved Rates of Convergence in Boosting
Authors
Abstract
The problem of constructing weak classifiers for boosting algorithms is studied. We present an algorithm that produces a linear classifier that is guaranteed to achieve an error better than random guessing for any distribution on the data. While this weak learner is not useful for learning in general, we show that under reasonable conditions on the distribution it yields an effective weak learner for one-dimensional problems. Preliminary simulations suggest that similar behavior can be expected in higher dimensions, a result which is corroborated by some recent theoretical bounds. Additionally, we provide improved convergence rate bounds for the generalization error in situations where the empirical error can be made small, which is exactly the situation that occurs if weak learners with guaranteed performance that is better than random guessing can be established.
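The abstract's claim that a linear classifier can always do better than random guessing is easiest to see in the one-dimensional case, where a linear classifier is a decision stump. Below is a minimal sketch (the helper name `stump_weak_learner` is hypothetical, and this is not the paper's construction): because flipping the sign of a stump turns weighted error e into 1 - e, the best stump over all thresholds and signs always has weighted error at most 1/2.

```python
import numpy as np

def stump_weak_learner(x, y, w):
    """Pick threshold t and sign s so that h(x) = s * sign(x - t)
    minimizes the weighted error on 1-D data with labels in {-1, +1}.
    Illustrative sketch only: since flipping s turns error e into 1 - e,
    the best stump's weighted error never exceeds 1/2, i.e. it is never
    worse than random guessing on any distribution over the sample.
    Thresholds are placed between consecutive points, so sign(x - t)
    is never 0 on the training points."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    order = np.argsort(x)
    xs, ys, ws = np.asarray(x)[order], np.asarray(y)[order], w[order]
    # Start with the threshold below every point and s = +1:
    # everything is predicted +1, so the error is the weight of y = -1.
    err = ws[ys == -1].sum()
    best_err = min(err, 1 - err)
    best_t, best_s = xs[0] - 1.0, 1 if err <= 1 - err else -1
    for i in range(len(xs)):
        # Moving the threshold just past xs[i] flips its prediction to -1.
        err += ws[i] if ys[i] == 1 else -ws[i]
        t = xs[i] + 1.0 if i == len(xs) - 1 else 0.5 * (xs[i] + xs[i + 1])
        for s, e in ((1, err), (-1, 1 - err)):
            if e < best_err:
                best_err, best_t, best_s = e, t, s
    return best_t, best_s, best_err
```

On a separable sample such as `x = [0, 1, 2, 3]`, `y = [-1, -1, 1, 1]` with uniform weights, the sweep finds a threshold between 1 and 2 with zero weighted error.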
Related papers
On Weak Base Learners for Boosting Regression and Classification
The most basic property of the boosting algorithm is its ability to reduce the training error, subject to the critical assumption that the base learners generate weak hypotheses that are better than random guessing. We exploit analogies between regression and classification to give a characterization of which base learners generate weak hypotheses, by introducing a geometric concept called the an...
Some Results on Weakly Accurate Base Learners for Boosting Regression and Classification
One basic property of the boosting algorithm is its ability to reduce the training error, subject to the critical assumption that the base learners generate `weak' (or more appropriately, `weakly accurate') hypotheses that are better than random guessing. We exploit analogies between regression and classification to give a characterization of which base learners generate weak hypotheses, by introdu...
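The weak-learning condition discussed in these abstracts has a standard quantitative consequence, a classical AdaBoost bound (due to Freund and Schapire, not a result specific to these papers): if round-\(t\)'s weak hypothesis has weighted error \(\epsilon_t = \tfrac{1}{2} - \gamma_t\) with edge \(\gamma_t > 0\), then after \(T\) rounds the training error of the combined classifier satisfies

\[
\mathrm{err}_{\mathrm{train}} \;\le\; \prod_{t=1}^{T} 2\sqrt{\epsilon_t\,(1-\epsilon_t)} \;\le\; \exp\!\Bigl(-2\sum_{t=1}^{T}\gamma_t^{2}\Bigr),
\]

so any uniform edge \(\gamma_t \ge \gamma > 0\) drives the training error to zero exponentially fast in \(T\). This is precisely why the guaranteed better-than-random weak learners studied here lead to the small-empirical-error regime in which the improved generalization bounds apply.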
Fast Training of Effective Multi-class Boosting Using Coordinate Descent Optimization
We present a novel column generation based boosting method for multi-class classification. Our multi-class boosting is formulated in a single optimization problem as in [1, 2]. Different from most existing multi-class boosting methods, which use the same set of weak learners for all the classes, we train class specified weak learners (i.e., each class has a different set of weak learners). We s...
A Hybrid Framework for Building an Efficient Incremental Intrusion Detection System
In this paper, a boosting-based incremental hybrid intrusion detection system is introduced. This system combines incremental misuse detection and incremental anomaly detection. We use a boosting ensemble of weak classifiers to implement the misuse intrusion detection system. It can identify new classes of intrusions that do not exist in the training dataset for incremental misuse detection. As...
Journal:
Volume / Issue:
Pages: -
Publication date: 2000